# Supercomputer training

## Bielik 4.5B V3

**License:** Apache-2.0

Bielik-4.5B-v3 is a Polish generative text model with 4.6 billion parameters, developed by SpeakLeash in collaboration with ACK Cyfronet AGH and trained on a curated Polish corpus.

Tags: Large Language Model · Transformers · Other
speakleash · 40 · 1
## Italia 9B Instruct V0.1

**License:** MIT

Italia 9B is an open-source large language model developed by iGenius, designed specifically for Italian. It captures the linguistic and cultural nuances of Italian and also performs well in English and on translation tasks.

Tags: Large Language Model · Transformers
iGeniusAI · 8,624 · 53
## Llama 3 8B Instruct

An instruction-tuned model based on Llama-3-8B for Nordic languages, supporting Swedish, Danish, and Norwegian.

Tags: Large Language Model · Transformers · Other
AI-Sweden-Models · 570 · 12
## Fugaku LLM 13B

**License:** Other

Fugaku-LLM is a large language model developed in Japan, pre-trained from scratch on the supercomputer Fugaku. It emphasizes transparency and security, with particularly strong performance in Japanese.

Tags: Large Language Model · Transformers · Supports Multiple Languages
Fugaku-LLM · 25 · 123
## Viking 33B

**License:** Apache-2.0

Viking 33B is a 33-billion-parameter decoder-only Transformer model supporting Finnish, English, and multiple Nordic languages, with capabilities in code understanding and generation.

Tags: Large Language Model · Transformers · Supports Multiple Languages
LumiOpen · 1,030 · 25
## Viking 13B

**License:** Apache-2.0

Viking 13B is a 13-billion-parameter multilingual large model supporting Finnish, English, and other Nordic languages, with code-processing capabilities.

Tags: Large Language Model · Transformers · Supports Multiple Languages
LumiOpen · 1,233 · 12